17 research outputs found

    Developing and testing a Corona VaccinE tRiAL pLatform (COVERALL) to study Covid-19 vaccine response in immunocompromised patients

    BACKGROUND The rapid course of the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic calls for fast implementation of clinical trials to assess the effects of new treatments and prophylactic interventions. Building trial platforms embedded in existing data infrastructures is an ideal way to address such questions within well-defined subpopulations. METHODS We developed a trial platform building on the infrastructure of two established national cohort studies: the Swiss human immunodeficiency virus (HIV) Cohort Study (SHCS) and the Swiss Transplant Cohort Study (STCS). In a pilot trial, termed Corona VaccinE tRiAL pLatform (COVERALL), we assessed the vaccine efficacy of the first two SARS-CoV-2 vaccines licensed in Switzerland and the functionality of the trial platform. RESULTS Using Research Electronic Data Capture (REDCap), we developed a trial platform integrating the infrastructure of the SHCS and STCS. An algorithm for identifying eligible patients, together with automated baseline data transfer, ensured a fast inclusion procedure. We implemented convenient redirections between the different data entry systems to ensure intuitive data entry for the participating study personnel. The trial platform, including a randomization algorithm ensuring balance among different subgroups, was continuously adapted to changing vaccination guidelines. We were able to randomize and vaccinate the first trial participant on the same day we received ethics approval, and time to enroll and randomize our target sample size of 380 patients was 22 days. CONCLUSION Taking the best of each system, we were able to flag eligible patients, transfer patient information automatically, and randomize and enroll patients in a simple workflow, decreasing the administrative burden usually associated with a trial of this size.
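    The subgroup-balanced randomization mentioned above can be sketched as stratified permuted-block randomization. This is a minimal illustration only, not the actual COVERALL algorithm; the arm labels, block size, and stratum sizes are assumptions for the example.

```python
import random

def block_randomize(stratum_size, arms=("vaccine_A", "vaccine_B"), block=4):
    """Permuted-block randomization within one stratum.

    Each block contains every arm equally often, so the allocation
    stays balanced throughout enrollment (hypothetical parameters).
    """
    assignments = []
    while len(assignments) < stratum_size:
        blk = list(arms) * (block // len(arms))  # balanced block
        random.shuffle(blk)                      # random order within block
        assignments.extend(blk)
    return assignments[:stratum_size]

# Randomize separately per stratum (e.g. cohort of origin), so each
# subgroup is balanced on its own; sizes are illustrative.
strata = {"SHCS": 190, "STCS": 190}
allocation = {name: block_randomize(n) for name, n in strata.items()}
```

    Because balance holds inside every block, the two arms can differ by at most a few assignments at any point in the enrollment, regardless of when recruitment stops.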

    Quantum Computing for High-Energy Physics: State of the Art and Challenges. Summary of the QC4HEP Working Group

    Quantum computers offer an intriguing path for a paradigmatic change of computing in the natural sciences and beyond, with the potential for achieving a so-called quantum advantage, namely a significant (in some cases exponential) speed-up of numerical simulations. The rapid development of hardware devices with various realizations of qubits enables the execution of small-scale but representative applications on quantum computers. In particular, the high-energy physics community plays a pivotal role in accessing the power of quantum computing, since the field is a driving source of challenging computational problems. This concerns, on the theoretical side, the exploration of models which are very hard or even impossible to address with classical techniques and, on the experimental side, the enormous data challenge of newly emerging experiments, such as the upgrade of the Large Hadron Collider. In this roadmap paper, led by CERN, DESY and IBM, we provide the status of high-energy physics quantum computations and give examples of theoretical and experimental target benchmark applications, which can be addressed in the near future. Having the IBM 100 x 100 challenge in mind, where possible, we also provide resource estimates for the examples given using error-mitigated quantum computing.

    Development and implementation of an autonomous control system for target-optimised use of intralogistics transport systems in the Learning Factory Werk 150 at Reutlingen University

    No full text
    Rapidly changing market conditions and global competition are leading to an increasing complexity of logistics systems and require innovative approaches with respect to the organisation and control of these systems. In scientific research, concepts of autonomously controlled logistics systems show a promising approach to meet the increasing requirements for flexible and efficient order processing. In this context, this work aims to introduce a system that is able to adjust order processing dynamically and optimise intralogistics transportation with respect to various generic intralogistics target criteria. The logistics system under consideration consists of various means of transport that autonomously make decisions and fulfil transport orders with defined source-sink relationships. The context of this work is set by introducing the Learning Factory Werk 150 with its existing hardware and software infrastructure and its defined target figures for measuring the performance of the system. Specifically, the important target figures of cost and performance are considered for the transportation system. The core idea of the system's logic is to solve the problem of allocating orders to specific means of transport by linking a Genetic Algorithm with a Multi-Agent System. The implementation of the developed system is described in an application scenario at the learning factory.
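    The order-allocation idea described above can be sketched as a small genetic algorithm that assigns transport orders to vehicles so as to minimise total cost. The cost matrix, population settings, and single cost criterion are assumptions for illustration, not the Werk 150 implementation (which combines the GA with a multi-agent system and multiple target figures).

```python
import random

# Hypothetical cost matrix: COSTS[order][vehicle] is the cost of
# serving a transport order with a given means of transport.
COSTS = [
    [4, 2, 7],
    [3, 6, 1],
    [5, 5, 2],
    [8, 1, 4],
]
N_ORDERS, N_VEHICLES = len(COSTS), len(COSTS[0])

def fitness(chromosome):
    # A chromosome maps order index -> vehicle index; lower cost is better.
    return sum(COSTS[o][v] for o, v in enumerate(chromosome))

def evolve(pop_size=20, generations=50, mutation_rate=0.1, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(N_VEHICLES) for _ in range(N_ORDERS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, N_ORDERS)      # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mutation_rate:      # random reassignment
                child[rng.randrange(N_ORDERS)] = rng.randrange(N_VEHICLES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
```

    In the full system, each agent in the multi-agent layer would contribute its own constraints and current state to the cost evaluation rather than reading a static matrix.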

    Optimizing Quantum Classification Algorithms on Classical Benchmark Datasets

    No full text
    The discovery of quantum algorithms offering provable advantages over the best known classical alternatives, together with the parallel ongoing revolution brought about by classical artificial intelligence, motivates a search for applications of quantum information processing methods to machine learning. Among several proposals in this domain, quantum kernel methods have emerged as particularly promising candidates. However, while some rigorous speedups on certain highly specific problems have been formally proven, only empirical proof-of-principle results have been reported so far for real-world datasets. Moreover, no systematic procedure is known, in general, to fine-tune and optimize the performance of kernel-based quantum classification algorithms. At the same time, certain limitations such as kernel concentration effects—hindering the trainability of quantum classifiers—have also been recently pointed out. In this work, we propose several general-purpose optimization methods and best practices designed to enhance the practical usefulness of fidelity-based quantum classification algorithms. Specifically, we first describe a data pre-processing strategy that, by preserving the relevant relationships between data points when processed through quantum feature maps, substantially alleviates the effect of kernel concentration on structured datasets. We also introduce a classical post-processing method that, based on standard fidelity measures estimated on a quantum processor, yields non-linear decision boundaries in the feature Hilbert space, thus achieving the quantum counterpart of the radial basis functions technique that is widely employed in classical kernel methods. Finally, we apply the so-called quantum metric learning protocol to engineer and adjust trainable quantum embeddings, demonstrating substantial performance improvements on several paradigmatic real-world classification tasks.
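    The classical post-processing idea above, building a radial-basis-style kernel on top of state fidelities, can be sketched classically. The single-qubit angle-encoding feature map and the gamma value are assumptions for illustration; they are not the paper's circuits or parameters, and the fidelities here are computed exactly rather than estimated on a quantum processor.

```python
import math

def feature_state(x):
    # Toy single-qubit angle-encoding feature map (an assumption):
    # x -> amplitudes (cos x, sin x) of a normalized state.
    return (math.cos(x), math.sin(x))

def fidelity(x, y):
    # Squared overlap |<phi(x)|phi(y)>|^2 between feature states.
    a, b = feature_state(x), feature_state(y)
    overlap = a[0] * b[0] + a[1] * b[1]
    return overlap ** 2

def rbf_on_fidelity(x, y, gamma=2.0):
    # Classical post-processing: an RBF-style kernel built from the
    # fidelity; 1 - F plays the role of a squared distance in the
    # feature Hilbert space, yielding non-linear decision boundaries.
    return math.exp(-gamma * (1.0 - fidelity(x, y)))

data = [0.1, 0.5, 1.2, 2.0]
K = [[rbf_on_fidelity(a, b) for b in data] for a in data]
```

    The resulting Gram matrix K has unit diagonal and is symmetric, so it can be handed to any standard kernel classifier in place of a plain fidelity kernel.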

    Extending the reach of quantum computing for materials science with machine learning potentials

    No full text
    Solving electronic structure problems represents a promising field of applications for quantum computers. Currently, much effort is spent in devising and optimizing quantum algorithms for near-term quantum processors, with the aim of outperforming classical counterparts on selected problem instances using limited quantum resources. These methods are still expected to feature a runtime preventing quantum simulations of large-scale and bulk systems. In this work, we propose a strategy to extend the scope of quantum computational methods to large-scale simulations using a machine learning potential trained on quantum simulation data. The challenge of applying machine learning potentials in today's quantum setting arises from the several sources of noise affecting the quantum computations of electronic energies and forces. We investigate the trainability of a machine learning potential under various sources of noise: statistical, optimization, and hardware noise. Finally, we construct the first machine learning potential from data computed on actual IBM Quantum processors for a hydrogen molecule. This already allows us to perform arbitrarily long and stable molecular dynamics simulations, outperforming all current quantum approaches to molecular dynamics and structure optimization.
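    The core idea, fitting a smooth surrogate potential to noisy quantum-computed energies, can be sketched with synthetic data. The harmonic toy potential, noise level, and quadratic fit are all assumptions for illustration; they stand in for the actual quantum-computed H2 energies and the machine learning potential used in the work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for quantum-computed H2 energies: a harmonic
# well around the equilibrium bond length, plus Gaussian noise
# lumping together statistical and hardware errors (assumptions).
r = np.linspace(0.5, 1.5, 30)              # bond lengths (arbitrary units)
true_energy = 0.5 * (r - 1.0) ** 2 - 1.0   # toy potential energy surface
noisy_energy = true_energy + rng.normal(scale=0.02, size=r.size)

# The "machine learning potential" here is a deliberately minimal
# surrogate: a quadratic least-squares fit to the noisy samples.
coeffs = np.polyfit(r, noisy_energy, deg=2)
potential = np.poly1d(coeffs)

# The smooth surrogate can now drive arbitrarily long dynamics
# without further quantum evaluations; e.g. the fitted equilibrium
# bond length follows from the vertex of the parabola.
r_eq = -coeffs[1] / (2 * coeffs[0])
```

    The point of the surrogate is that, once trained, each additional energy or force evaluation is a cheap classical call, decoupling the length of a molecular dynamics run from the cost of the quantum hardware.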

    Unravelling physics beyond the standard model with classical and quantum anomaly detection

    No full text
    Much hope for finding new physics phenomena at microscopic scale relies on the observations obtained from High Energy Physics experiments, like the ones performed at the Large Hadron Collider (LHC). However, current experiments do not indicate clear signs of new physics that could guide the development of additional Beyond Standard Model (BSM) theories. Identifying signatures of new physics out of the enormous amount of data produced at the LHC falls into the class of anomaly detection and constitutes one of the greatest computational challenges. In this article, we propose a novel strategy to perform anomaly detection in a supervised learning setting, based on the artificial creation of anomalies through a random process. For the resulting supervised learning problem, we successfully apply classical and quantum support vector classifiers (CSVC and QSVC respectively) to identify the artificial anomalies among the SM events. Even more promisingly, we find that an SVC trained to identify the artificial anomalies can also identify realistic BSM events with high accuracy. In parallel, we also explore the potential of quantum algorithms for improving the classification accuracy and provide plausible conditions for the best exploitation of this novel computational paradigm.
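    The artificial-anomaly construction can be sketched with toy data: draw "SM" events from a background distribution, generate anomalies through a simple random process, and train a supervised classifier on the labeled mixture. Everything here is an assumption for illustration: 2D Gaussian stand-ins replace real collider features, and a nearest-centroid threshold rule stands in for the SVC/QSVC used in the article.

```python
import random, math

rng = random.Random(1)

# "SM events": points clustered near the origin (toy stand-in for
# real event features). "Artificial anomalies": drawn uniformly
# over a wider box by a random process, as in the strategy above.
sm = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(200)]
anomalies = [(rng.uniform(-5, 5), rng.uniform(-5, 5)) for _ in range(200)]

train = [(p, 0) for p in sm] + [(p, 1) for p in anomalies]

# Minimal supervised stand-in for the SVC: learn the SM centroid
# from the labeled data and classify by distance from it.
cx = sum(p[0] for p, y in train if y == 0) / 200
cy = sum(p[1] for p, y in train if y == 0) / 200
RADIUS = 2.5                               # hypothetical decision radius

def predict(p):
    return 1 if math.hypot(p[0] - cx, p[1] - cy) > RADIUS else 0

accuracy = sum(predict(p) == y for p, y in train) / len(train)
```

    Because the anomalies are generated rather than observed, labels come for free, turning an unsupervised anomaly-detection problem into an ordinary supervised one; the hope, as in the article, is that a classifier trained this way generalizes to genuine BSM-like events.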